
    Operation of a quantum dot in the finite-state machine mode: single-electron dynamic memory

    A single electron dynamic memory is designed based on the non-equilibrium dynamics of charge states in electrostatically defined metallic quantum dots. Using the orthodox theory to compute the transfer rates and a master equation, we model the dynamical response of devices consisting of a charge sensor coupled to either a single or a double quantum dot subjected to a pulsed gate voltage. We show that transition rates between charge states in metallic quantum dots are characterized by an asymmetry that can be controlled by the gate voltage. This effect is more pronounced when the switching between charge states corresponds to a Markovian process involving electron transport through a chain of several quantum dots. By simulating the dynamics of electron transport we demonstrate that the quantum box operates as a finite-state machine that can be addressed by choosing suitable shapes and switching rates of the gate pulses. We further show that writing times in the ns range and memory retention times six orders of magnitude longer, in the ms range, can be achieved on the double quantum dot system using experimentally feasible parameters, thereby demonstrating that the device can operate as a dynamic single-electron memory. (18 pages, 8 figures.)
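    The simulation behind such a device boils down to integrating a rate (master) equation whose tunnelling rates depend on the gate voltage. The sketch below, in Python, illustrates that idea for a toy two-charge-state dot; the rate expressions, magnitudes, and pulse shape are illustrative assumptions, not the orthodox-theory rates used in the paper.

        import numpy as np

        # Minimal two-state master equation for dot occupation P = [P_empty, P_charged].
        # gamma_in / gamma_out are placeholder rates; in the paper they would come from
        # orthodox-theory tunnelling rates that depend on the gate voltage.
        def gamma_in(vg):   # electron tunnels onto the dot
            return 1e9 * np.exp(vg)      # 1/s, toy gate-voltage dependence
        def gamma_out(vg):  # electron tunnels off the dot
            return 1e9 * np.exp(-vg)     # asymmetry controlled by the sign of vg

        def simulate(pulse, dt=1e-12):
            """Integrate dP/dt = W(vg(t)) P for a gate-voltage pulse (array of vg values)."""
            p = np.array([1.0, 0.0])     # start with the dot empty
            history = []
            for vg in pulse:
                gin, gout = gamma_in(vg), gamma_out(vg)
                w = np.array([[-gin,  gout],
                              [ gin, -gout]])
                p = p + dt * w @ p        # forward-Euler step of the master equation
                history.append(p[1])      # probability that the dot is charged
            return np.array(history)

        # "Write" pulse, then hold at zero gate voltage; a single dot relaxes quickly,
        # which is why the paper relies on a double-dot chain for long retention.
        pulse = np.concatenate([np.full(2000, 1.5), np.zeros(8000)])
        occupancy = simulate(pulse)
        print(f"occupancy after write: {occupancy[1999]:.3f}, after hold: {occupancy[-1]:.3f}")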

    Pragmatic Ontology Evolution: Reconciling User Requirements and Application Performance

    Increasingly, organizations are adopting ontologies to describe their large catalogues of items. These ontologies need to evolve regularly in response to changes in the domain and the emergence of new requirements. An important step of this process is the selection of candidate concepts to include in the new version of the ontology. This operation needs to take into account a variety of factors and, in particular, reconcile user requirements and application performance. Current ontology evolution methods focus either on ranking concepts according to their relevance or on preserving compatibility with existing applications. However, they do not take into consideration the impact of the ontology evolution process on the performance of computational tasks; in this work we focus on instance tagging, similarity computation, generation of recommendations, and data clustering. In this paper, we propose the Pragmatic Ontology Evolution (POE) framework, a novel approach for selecting, from a group of candidates, a set of concepts able to produce a new version of a given ontology that i) is consistent with a set of user requirements (e.g., a maximum number of concepts in the ontology), ii) is parametrised with respect to a number of dimensions (e.g., topological considerations), and iii) effectively supports relevant computational tasks. Our approach also supports users in navigating the space of possible solutions by showing how certain choices, such as limiting the number of concepts or privileging trendy concepts rather than historical ones, would affect application performance. An evaluation of POE on the real-world scenario of the evolving Springer Nature taxonomy for editorial classification yielded excellent results, demonstrating a significant improvement over alternative approaches.
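    At its core, this kind of concept selection is a constrained, multi-dimensional ranking problem. The following minimal sketch is not the POE implementation; the dimensions, weights, and field names are assumptions. It only shows how candidate concepts might be scored along several dimensions and selected under a user requirement such as a maximum ontology size.

        from typing import NamedTuple

        class Candidate(NamedTuple):
            name: str
            relevance: float      # how well the concept fits the domain
            trendiness: float     # recent usage growth
            task_gain: float      # estimated improvement on downstream tasks (tagging, clustering, ...)

        def select_concepts(candidates, max_concepts, weights=(0.4, 0.2, 0.4)):
            """Greedy selection under a user requirement (max number of concepts),
            parametrised by weights over several dimensions. Illustrative only."""
            w_rel, w_trend, w_task = weights
            scored = sorted(
                candidates,
                key=lambda c: w_rel * c.relevance + w_trend * c.trendiness + w_task * c.task_gain,
                reverse=True,
            )
            return [c.name for c in scored[:max_concepts]]

        pool = [
            Candidate("machine_learning", 0.9, 0.8, 0.7),
            Candidate("expert_systems",   0.6, 0.1, 0.3),
            Candidate("deep_learning",    0.8, 0.9, 0.8),
        ]
        print(select_concepts(pool, max_concepts=2))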

    On the Construction of Human-Automation Interfaces by Formal Abstraction

    In this paper we present a formal methodology and an algorithmic procedure for constructing human-automation interfaces and corresponding user manuals. Our focus is the information provided to the user about the behavior of the underlying machine, rather than the graphical and layout features of the interface itself. Our approach involves a systematic reduction of the behavioral model of the machine, as well as systematic abstraction of the information that is displayed in the interface. This reduction procedure satisfies two requirements: first, the interface must be correct so as not to cause mode confusion that may lead the user to perform incorrect actions; second, the interface must be as simple as possible and not include any unnecessary information. The algorithm for generating such interfaces can be automated, and a preliminary software system for its implementation has been developed.
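    One common way to realise such a reduction, shown in the sketch below, is to collapse machine states that the user cannot distinguish. This is only an illustrative stand-in for the paper's procedure: it merges states by observable output using a standard partition-refinement pass, and the data structures are our own assumptions.

        def reduce_machine(states, outputs, delta, inputs):
            """Merge states with the same user-visible output and identical successor blocks.
            states: list; outputs: dict state -> label; delta: total dict (state, input) -> state.
            Each resulting block can be presented as one interface 'mode'. Illustrative only."""
            # Start from the partition induced by what the user can observe.
            blocks = {}
            for s in states:
                blocks.setdefault(outputs[s], set()).add(s)
            partition = list(blocks.values())

            changed = True
            while changed:
                changed = False
                new_partition = []
                for block in partition:
                    # Group states of the block by which block each input leads to.
                    def signature(s):
                        return tuple(
                            next(i for i, b in enumerate(partition) if delta[(s, a)] in b)
                            for a in inputs
                        )
                    groups = {}
                    for s in block:
                        groups.setdefault(signature(s), set()).add(s)
                    new_partition.extend(groups.values())
                    if len(groups) > 1:
                        changed = True
                partition = new_partition
            return partition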

    Learning Moore Machines from Input-Output Traces

    The problem of learning automata from example traces (but no equivalence or membership queries) is fundamental in automata learning theory and practice. In this paper we study this problem for finite-state machines with inputs and outputs, and in particular for Moore machines. We develop three algorithms for solving this problem: (1) the PTAP algorithm, which transforms a set of input-output traces into an incomplete Moore machine and then completes the machine with self-loops; (2) the PRPNI algorithm, which uses the well-known RPNI algorithm for automata learning to learn a product of automata encoding a Moore machine; and (3) the MooreMI algorithm, which directly learns a Moore machine using PTAP extended with state merging. We prove that MooreMI has the fundamental identification-in-the-limit property. We also compare the algorithms experimentally in terms of the size of the learned machine and several notions of accuracy introduced in this paper. Finally, we compare with OSTIA, an algorithm that learns a more general class of transducers, and find that OSTIA generally does not learn a Moore machine, even when fed with a characteristic sample.
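    The PTAP step, as described above, amounts to folding the traces into a prefix tree whose nodes carry outputs and then making the machine total with self-loops. A minimal Python sketch of that idea follows; the function and variable names are ours, and consistent traces are assumed.

        def build_ptap(traces, input_alphabet):
            """traces: list of (inputs, outputs) with len(outputs) == len(inputs) + 1,
            since a Moore machine also emits an output in the initial state.
            Returns (transitions, output) dicts keyed by integer state ids. Sketch only."""
            transitions, output = {}, {}
            root, next_id = 0, 1
            for inputs, outputs in traces:
                state = root
                output[state] = outputs[0]
                for symbol, out in zip(inputs, outputs[1:]):
                    if (state, symbol) not in transitions:
                        transitions[(state, symbol)] = next_id
                        next_id += 1
                    state = transitions[(state, symbol)]
                    output[state] = out          # assumes the traces are consistent
            # Complete the machine: any undefined transition becomes a self-loop.
            for state in list(output):
                for symbol in input_alphabet:
                    transitions.setdefault((state, symbol), state)
            return transitions, output

        # Two consistent traces over inputs {a, b} with outputs {0, 1}.
        traces = [("ab", [0, 1, 0]), ("aa", [0, 1, 1])]
        delta, lam = build_ptap(traces, "ab")
        print(lam, delta)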

    What Shall I Do Next? Intention Mining for Flexible Process Enactment

    Besides the benefits of flexible processes, practical implementations of process-aware information systems have also revealed difficulties encountered by process participants during enactment. Several support and guidance solutions based on process mining have been proposed, but they lack a suitable semantics for human reasoning and decision making as they mainly rely on low-level activities. Applying design science, we created FlexPAISSeer, an intention-mining-oriented approach, with its component artifacts: 1) IntentMiner, which discovers the intentional model of the executable process in an unsupervised manner; 2) IntentRecommender, which generates recommendations as intentions and confidence factors, based on the mined intentional process model and probabilistic calculus. The artifacts were evaluated in a case study with a software company in the Netherlands, using a childcare system that allows flexible data-driven process enactment.
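    A simple way to picture the recommendation step is as next-intention prediction from transition frequencies mined out of past executions, with the empirical probability reported as a confidence factor. The sketch below follows that spirit only; the trace format, names, and use of a first-order (Markov-style) model are assumptions, not FlexPAISSeer's actual probabilistic calculus.

        from collections import Counter, defaultdict

        def mine_transitions(intention_traces):
            """intention_traces: list of lists of intention labels, in execution order."""
            counts = defaultdict(Counter)
            for trace in intention_traces:
                for current, nxt in zip(trace, trace[1:]):
                    counts[current][nxt] += 1
            return counts

        def recommend_next(counts, current_intention, top_k=2):
            """Return (intention, confidence) pairs; confidence is an empirical probability."""
            nxt = counts.get(current_intention)
            if not nxt:
                return []
            total = sum(nxt.values())
            return [(i, c / total) for i, c in nxt.most_common(top_k)]

        traces = [
            ["register_child", "plan_schedule", "confirm_schedule"],
            ["register_child", "plan_schedule", "adjust_capacity"],
            ["register_child", "confirm_schedule"],
        ]
        counts = mine_transitions(traces)
        print(recommend_next(counts, "register_child"))   # [('plan_schedule', 0.66...), ('confirm_schedule', 0.33...)]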

    Optimal feature selection for classifying a large set of chemicals using metal oxide sensors

    Using linear support vector machines, we investigated the feature selection problem for the application of all-against-all classification of a set of 20 chemicals using two types of sensors, classical doped tin oxide and zeolite-coated chromium titanium oxide sensors. We defined a simple set of possible features, namely the identity of the sensors and the sampling times, and tested all possible combinations of such features in a wrapper approach. We confirmed that performance is improved, relative to previous results using this data set, by exhaustive comparison of these feature sets. Using the maximal number of different sensors and all available data points for each sensor does not necessarily yield the best results, even for the large number of classes in this problem. We contrast this analysis, using exhaustive screening of simple feature sets, with a number of more complex feature choices and find that subsampled sets of simple features can perform better. Analysis of potential predictors of classification performance revealed some relevance of clustering properties of the data and of correlations among sensor responses, but failed to identify a single measure to predict classification success, reinforcing the relevance of the wrapper approach used. Comparison of the two sensor technologies showed that, in isolation, the doped tin oxide sensors performed better than the zeolite-coated chromium titanium oxide sensors, but that mixed arrays, combining both technologies, performed best.
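    A wrapper approach of this kind scores each candidate feature set by the cross-validated accuracy of the classifier itself. The sketch below illustrates it with scikit-learn's LinearSVC on a synthetic array shaped samples x sensors x sampling times; the data, array sizes, and helper names are illustrative assumptions, not the study's data set.

        from itertools import combinations
        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        # Illustrative data: sensor responses at several sampling times for 20 chemical classes.
        rng = np.random.default_rng(0)
        n_samples, n_sensors, n_times = 400, 3, 4
        X = rng.normal(size=(n_samples, n_sensors, n_times))
        y = rng.integers(0, 20, size=n_samples)

        def score_feature_set(sensor_ids, time_ids):
            """Wrapper criterion: cross-validated accuracy of a linear SVM restricted
            to the chosen sensors and sampling times."""
            features = X[:, list(sensor_ids), :][:, :, list(time_ids)].reshape(n_samples, -1)
            return cross_val_score(LinearSVC(max_iter=5000), features, y, cv=3).mean()

        # Exhaustively screen every non-empty combination of sensors and sampling times.
        candidates = [
            (s, t)
            for r_s in range(1, n_sensors + 1) for s in combinations(range(n_sensors), r_s)
            for r_t in range(1, n_times + 1) for t in combinations(range(n_times), r_t)
        ]
        best = max(candidates, key=lambda st: score_feature_set(*st))
        print("best sensors:", best[0], "best sampling times:", best[1])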

    Gene encoder: a feature selection technique through unsupervised deep learning-based clustering for large gene expression data

    Cancer is a severe condition of uncontrolled cell division that results in a tumor that spreads to other tissues of the body. Therefore, the development of new medication and treatment methods is in demand. Classification of microarray data plays a vital role in handling such situations, and relevant gene selection is an important step in classifying microarray data. This work presents gene encoder, an unsupervised two-stage feature selection technique for the classification of cancer samples. The first stage aggregates three filter methods, namely principal component analysis, correlation-based, and spectral-based feature selection techniques. Next, a genetic algorithm is used, which evaluates chromosomes using autoencoder-based clustering. The resultant feature subset is used for the classification task. Three classifiers, namely support vector machine, k-nearest neighbors, and random forest, are used in this work to avoid dependency on any one classifier. Six benchmark gene expression datasets are used for the performance evaluation, and a comparison is made with four state-of-the-art related algorithms. Three sets of experiments are carried out to evaluate the proposed method: evaluating the selected features through sample-based clustering, adjusting optimal parameters, and selecting the better-performing classifier. The comparison is based on accuracy, recall, false positive rate, precision, F-measure, and entropy. The obtained results suggest better performance of the proposed method.
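    The first stage described above can be pictured as running several cheap filter rankings and pooling their top genes before the more expensive genetic-algorithm / autoencoder stage. The following sketch shows only that filter-aggregation stage under our own assumptions (a simplified Laplacian-score-style spectral criterion, an arbitrary top-k, toy data); it is not the gene encoder implementation.

        import numpy as np

        def pca_scores(X, n_components=5):
            # Gene importance = summed absolute loadings on the leading principal components.
            Xc = X - X.mean(axis=0)
            _, _, vt = np.linalg.svd(Xc, full_matrices=False)
            return np.abs(vt[:n_components]).sum(axis=0)

        def correlation_scores(X, y):
            # Absolute Pearson correlation of each gene with the class label.
            Xc = X - X.mean(axis=0)
            yc = y - y.mean()
            return np.abs(Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12))

        def spectral_scores(X):
            # Simplified Laplacian-score-style criterion on the sample-similarity graph;
            # smoother genes get a lower ratio, so we negate it to rank them higher.
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            sigma2 = np.median(d2) + 1e-12          # heuristic bandwidth
            W = np.exp(-d2 / (2 * sigma2))
            D = np.diag(W.sum(axis=1))
            L = D - W
            num = np.einsum('si,st,ti->i', X, L, X)
            den = np.einsum('si,st,ti->i', X, D, X) + 1e-12
            return -num / den

        def aggregate_filters(X, y, top_k=50):
            # Union of the top-k genes from each filter; this candidate set would then
            # be handed to the genetic-algorithm / autoencoder-clustering stage.
            tops = [set(np.argsort(s)[::-1][:top_k])
                    for s in (pca_scores(X), correlation_scores(X, y), spectral_scores(X))]
            return sorted(set.union(*tops))

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 500))          # 60 samples x 500 genes (toy size)
        y = rng.integers(0, 2, size=60)
        print(len(aggregate_filters(X, y)), "candidate genes")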